Block coordinate descent algorithms for large-scale sparse multiclass classification
Similar resources
Block Coordinate Descent for Sparse NMF
Nonnegative matrix factorization (NMF) has become a ubiquitous tool for data analysis. An important variant is the sparse NMF problem, which arises when we explicitly require the learnt features to be sparse. A natural measure of sparsity is the L0 norm; however, its optimization is NP-hard. Mixed norms, such as the L1/L2 measure, have been shown to model sparsity robustly, based on intuitive attribu...
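A block coordinate descent scheme for L1-penalized NMF can be sketched as alternating projected-gradient steps on the two factors. The sketch below is a minimal illustration under that assumption, not the algorithm of the paper above; the function name `sparse_nmf_bcd`, the single gradient step per block, and the penalty weight `lam` are choices made for exposition:

```python
import numpy as np

def sparse_nmf_bcd(X, rank, lam=0.1, n_outer=100, rng=None):
    """Block coordinate descent sketch for
    min (1/2) ||X - W H||_F^2 + lam * ||H||_1  s.t.  W >= 0, H >= 0,
    taking one projected-gradient step per block per outer iteration."""
    rng = np.random.default_rng(rng)
    m, n = X.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_outer):
        # H-block: smooth gradient plus the constant L1 subgradient lam,
        # step 1/L with L the Lipschitz constant of the smooth gradient,
        # then projection onto the nonnegative orthant.
        L_H = np.linalg.norm(W.T @ W, 2) + 1e-12
        grad_H = W.T @ (W @ H - X) + lam
        H = np.maximum(0.0, H - grad_H / L_H)
        # W-block: plain projected-gradient step (no sparsity penalty on W).
        L_W = np.linalg.norm(H @ H.T, 2) + 1e-12
        grad_W = (W @ H - X) @ H.T
        W = np.maximum(0.0, W - grad_W / L_W)
    return W, H
```

Each block update is a descent step on the objective with the other block held fixed, which is the defining structure of block coordinate descent.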
Greedy Block Coordinate Descent for Large Scale Gaussian Process Regression
We propose a variable decomposition algorithm, greedy block coordinate descent (GBCD), in order to make dense Gaussian process regression practical for large-scale problems. GBCD breaks a large-scale optimization into a series of small sub-problems. The challenge in variable decomposition algorithms is the identification of a subproblem (the active set of variables) that yields the largest impro...
A Block-Coordinate Descent Approach for Large-scale Sparse Inverse Covariance Estimation
The sparse inverse covariance estimation problem arises in many statistical applications in machine learning and signal processing. In this problem, the inverse of a covariance matrix of a multivariate normal distribution is estimated, assuming that it is sparse. An L1-regularized log-determinant optimization problem is typically solved to approximate such matrices. Because of memory limitation...
Large Scale Kernel Learning using Block Coordinate Descent
We demonstrate that distributed block coordinate descent can quickly solve kernel regression and classification problems with millions of data points. Armed with this capability, we conduct a thorough comparison between the full kernel, the Nyström method, and random features on three large classification tasks from various domains. Our results suggest that the Nyström method generally achieves...
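As a reference point for the comparison described above, the Nyström method approximates a full kernel matrix from a subset of sampled landmark columns. The sketch below is a minimal single-machine illustration, not the distributed solver of the paper; the function names and the choice of an RBF kernel are assumptions made for exposition:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix K[i, j] = exp(-gamma * ||X[i] - Y[j]||^2)."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def nystrom(X, m, gamma=1.0, rng=None):
    """Rank-m Nystrom approximation K ~= C @ pinv(W) @ C.T,
    built from m landmark points sampled uniformly without replacement."""
    rng = np.random.default_rng(rng)
    idx = rng.choice(len(X), size=m, replace=False)  # landmark subset
    C = rbf_kernel(X, X[idx], gamma)                 # n x m cross-kernel block
    W = C[idx]                                       # m x m landmark kernel block
    return C @ np.linalg.pinv(W) @ C.T
```

When `m` equals the number of points, the approximation recovers the full kernel matrix exactly; smaller `m` trades accuracy for an O(nm) memory footprint instead of O(n^2).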
Frugal Coordinate Descent for Large-Scale NNLS
The Nonnegative Least Squares (NNLS) formulation arises in many important regression problems. We present a novel coordinate descent method which differs from previous approaches in that we do not explicitly maintain complete gradient information. Empirical evidence shows that our approach outperforms a state-of-the-art NNLS solver in computation time for calculating radiation dosage for cancer...
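For contrast with the frugal variant described above, the textbook cyclic coordinate descent for NNLS (which, unlike that variant, does maintain a running residual as its bookkeeping) can be sketched as follows; this is a standard scheme written for exposition, not the method of the paper:

```python
import numpy as np

def nnls_cd(A, b, n_iter=200):
    """Cyclic coordinate descent sketch for min ||A x - b||^2  s.t.  x >= 0.
    Each coordinate is set to its exact 1-D minimizer, projected onto x_i >= 0."""
    n = A.shape[1]
    x = np.zeros(n)
    r = -b.astype(float).copy()        # residual r = A @ x - b (x starts at 0)
    col_sq = (A ** 2).sum(axis=0)      # per-coordinate curvature ||A[:, i]||^2
    for _ in range(n_iter):
        for i in range(n):
            if col_sq[i] == 0.0:
                continue
            g = A[:, i] @ r                           # partial gradient w.r.t. x[i]
            new_xi = max(0.0, x[i] - g / col_sq[i])   # exact 1-D step, projected
            if new_xi != x[i]:
                r += (new_xi - x[i]) * A[:, i]        # keep residual in sync
                x[i] = new_xi
    return x
```

The residual update keeps each coordinate step at O(rows) cost; the "frugal" idea in the abstract is precisely to avoid carrying even this much gradient information.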
Journal
Journal title: Machine Learning
Year: 2013
ISSN: 0885-6125,1573-0565
DOI: 10.1007/s10994-013-5367-2